Generalized Symmetric Divergence Measures and the Probability of Error
Author
Abstract
There are three classical divergence measures in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler [6], [8] J-divergence, the Sibson-Burbea-Rao [9], [3] Jensen-Shannon divergence, and the Taneja [11] arithmetic-geometric divergence. These three measures bear an interesting relationship to one another. Divergence measures such as the Hellinger [5] discrimination, the symmetric χ²-divergence, and the triangular discrimination are also known in the literature. In this paper, we consider generalized symmetric divergence measures that contain the measures above as particular cases. Bounds on the probability of error are obtained in terms of these generalized symmetric divergence measures, and the study of such bounds is extended to differences of divergence measures.
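As an illustrative sketch (not taken from the paper itself), the classical measures named in the abstract can be computed directly for discrete probability distributions. The Python below uses the standard textbook definitions and numerically checks one well-known relationship among the three logarithmic measures, namely I(P,Q) + T(P,Q) = J(P,Q)/4, where I is Jensen-Shannon, T is arithmetic-geometric, and J is the J-divergence; the function names and the example distributions are my own choices.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence K(P||Q) for discrete distributions
    given as lists of strictly positive probabilities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def j_divergence(p, q):
    """Jeffreys J-divergence: the symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence I(P,Q), using the midpoint M = (P+Q)/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def arithmetic_geometric(p, q):
    """Arithmetic-geometric divergence T(P,Q): KL-type distance between
    the arithmetic and geometric means of P and Q."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q))

def hellinger(p, q):
    """Hellinger discrimination h(P,Q)."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination Delta(P,Q)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

def symmetric_chi2(p, q):
    """Symmetric chi-square divergence Psi(P,Q)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

# Example distributions (hypothetical, for illustration only).
p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

# Check the identity I + T = J/4 numerically.
lhs = jensen_shannon(p, q) + arithmetic_geometric(p, q)
rhs = j_divergence(p, q) / 4
assert abs(lhs - rhs) < 1e-12
```

The identity follows by expanding the logarithms term by term: the (p+q)·log(p+q) contributions of I and T cancel, leaving ¼·Σ(p−q)(log p − log q), which is J/4.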
Similar Resources
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will verify measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...
Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures
Jensen-Shannon, J-divergence, and arithmetic-geometric mean divergences are three classical divergence measures known in the information theory and statistics literature. These three divergence measures satisfy interesting inequalities with the three non-logarithmic measures known as triangular discrimination, Hellinger's divergence, and symmetric chi-square divergence. However, in 2003, Eve studied ...
Generalized Symmetric Divergence Measures and Metric Spaces
Recently, Taneja [7] studied two one-parameter generalizations of the J-divergence, Jensen-Shannon divergence, and arithmetic-geometric divergence. These two generalizations in particular contain measures such as the Hellinger discrimination, symmetric chi-square divergence, and triangular discrimination. These measures are well known in the statistics and information theory literature. In thi...
Bounds on Nonsymmetric Divergence Measure in terms of Other Symmetric and Nonsymmetric Divergence Measures
Vajda (1972) studied a generalized divergence measure of Csiszár's class, the so-called "chi-m divergence measure." The variational distance and the chi-square divergence are special cases of this generalized divergence measure at m = 1 and m = 2, respectively. In this work, a nonparametric nonsymmetric measure of divergence, a particular case of Vajda's generalized divergence at m = 4, is taken and charac...
On Generalized Cauchy-stieltjes Transforms of Some Beta Distributions
We express the generalized Cauchy-Stieltjes transforms (GCST) of some particular Beta distributions depending on a positive parameter λ as λ-powered Cauchy-Stieltjes transforms (CST) of some probability measures. The CST of the latter measures are shown to be the geometric mean of the CST of the Wigner law together with another one. Moreover, they are absolutely continuous and we derive their d...
Journal: CoRR
Volume: abs/1103.5218
Issue: -
Pages: -
Publication date: 2011